Look 'hear', primary auditory cortex is active during lip-reading.
INTRODUCTION

A key mechanistic principle of the nervous system is one of initial segmentation, whereby each environmental object or event is subdivided into its elemental parts. For example, in the visual system, features such as colour, location, motion and texture are analyzed by largely separate cortical regions. Nowhere is the segregation of inputs more obvious than across the sensory systems, where the sensory epithelia are so very different and are contained in completely separate sensory organs (ears, eyes, skin, nose, etc.). A key question is how a nervous system that apparently operates by dividing its inputs ultimately provides the unitary, coherent and flowing perceptual world that most of us experience: the so-called binding problem. Some understanding is beginning to emerge regarding how objects and events are built up again within individual sensory systems [1]. Far less is known, however, about how inputs from the various sensory systems interact across the ascending levels of the cortical (and perhaps even subcortical) processing hierarchies. In this issue, Pekkola and colleagues [2] present neuroimaging data that clearly show that watching the lip movements of a speaker, in the absence of any auditory stimulation, results in activation of the primary auditory cortex. These data appear to resolve convincingly a long-running debate: some groups had suggested involvement of the primary auditory region in speech-reading [3], whereas others had claimed that only secondary auditory regions were involved [4]. However, none of the former studies used strict anatomical mapping techniques to isolate primary auditory cortex on an individual-subject basis, and as such, primary cortical involvement has remained under dispute. By anatomically defining primary auditory cortex for individual subjects using high-resolution MRI reconstructions, the Pekkola study greatly increased sensitivity.
This allowed them to show clear primary auditory activation in seven of their ten subjects during silent speech-reading, while nine of the ten subjects showed activation of more lateral auditory regions along Heschl's gyrus. Visually driven activation of a non-visual primary sensory region is in opposition to the general conception of how primary sensory regions operate – that is, that they receive exclusively unisensory inputs – and many in the field will be surprised to learn that primary sensory cortex is in fact penetrable by information from another sensory system. Of course, a major drawback of the neuroimaging methodology used in Pekkola's study is that it cannot resolve the route by which visually driven activation of auditory cortex occurs, or the timing of the effects, but there are several distinct possibilities. One of these is that the effect is driven by top-down inputs from known higher-order multisensory regions following extensive processing in lower-tier unisensory areas: a feedback model. This is the routing system that has been favored in the literature [5]. An alternative, but more controversial, model is that there are inputs to early sensory regions from other sensory systems that affect stimulus processing in a feedforward manner [6,8], although these latter studies have assessed only basic audio-visual stimuli and it is probable that language-based stimuli will involve more complex processing routines. It remains to be investigated, using the high temporal resolution of electrophysiological techniques, when during the time course of primary auditory cortical processing the processing of visual articulatory gestures has its effect on speech processing. One question that arises from the present findings is just how specific such activation of auditory cortex is to speech stimuli. For example, recent data have shown audio-visual interactions in primary auditory cortex when letter stimuli are presented in conjunction with their spoken counterparts [9].
However, in support of a relatively specific role in speech-reading, presentation of the visual letter stimuli in isolation produced no activation of primary auditory cortex. In a similar vein, electrophysiological studies have shown clear (and early) modulation of auditory event-related potential components by non-speech visual stimuli, suggesting that not all visual modulation of auditory processing needs to be speech-based [7]. However, the left-hemisphere dominance of the responses seen in Pekkola's study suggests that they were driven by the speech content of the stimuli, given the well-known specialization of the left hemisphere for language processing. Indeed, they also show
Journal: Neuroreport
Volume 16, Issue 2
Published: 2005